Ugly Stupid Honest

An inquiry into machine poetics.

The current causality-based, anthropocentric approach to designing architecture always starts with a hunch, a guess, a question or an intuition. We tinker, we try and we test with radically imperfect information. These subjective first steps are crucial to the design process and essential to what will later become the architectural language, the style, the identity or the poetics.

This field guide is written as a collection of tales between me and my machine. This contribution is supplemented by a short film.


According to Mario Carpo, we have already entered the second digital turn, a new epoch, more mature than its predecessor, in which the first sparks of digital culture have been assimilated and a new understanding of humans’ role in defining the society we will be living in is being acknowledged: “...we are learning that machines can work better and faster, not when we subject them to our own modus operandi, but when we let them follow a different, nonhuman, post-scientific method; and we increasingly find it easier to let computers solve problems in their own way – even when we do not understand what they do or how they do it. In a metaphorical sense, computers are now developing their own science – a new kind of science.”

The open question is whether, and to what extent, this new post-scientific method will replace the current design methodology. What would be the cost? What would be the benefit?

The second digital wave implies that we must accept humanity’s failure in its attempt to master nature as an object by its own means, illustrating how, quite ironically, the solution to the problem could be provided by one of the outcomes of the issue itself: computation. But do we really want that? Why would we want to exclude failure and error? We should embrace the uncontrolled and the unexpected whilst dancing into the post-Anthropocene.

Now more than ever we should bring back the stupid, the ugly and the honest.



The rules of the framework.



1) Alles ist Architektur (Hollein, 1968).
2) All information is equal and is objectively true.
3) To experience means to function.

Rules of Thumb (I)

I only act as an operator.
I do not intervene in any way.

I do not think.
I do not complain.
I react.
I follow.
I measure.
I observe.
I follow instructions.
I am dependent.
I am mobile.
I listen.
I am passive.
I am the one who is guided.
I do not function without him.

I undergo.

I have a veto for illegal trespassing.

Rules of Thumb (he)

I am the commander.

I think.
I perceive.
I instigate.
I act.
I inquire.
I decide.
I instruct.
I am dependent.
I am immobile.
I demand.
I am active.
I guide.
I do not function without him.

I do.

I give a direction every thirty minutes.



21/02/2020
13.45h ish
preface

I dragged him outside, almost headbutting a fellow passenger. The guy reacted with a disgruntled look as I apologized in vain, just as I saw the automatic door closing. By the time I had reorganized and regained my composure, he had already attracted the attention of most commuters. He didn't react though, as you might expect. I did my best to explain that my French was not up to par and quickly excused myself to the nearby bystanders.
We continued our "grand tour" through the station's hallway, which measured around 1m80 in height and was about 50 metres long. Even without him it would have been a challenge.
While I was contorting my back in ways I probably never had before, just to pass through the hallway (you could argue whether I was holding him upright, or he, me), he gave his first sign of life.
I heard his response but could not comply. There was no passage to the left nor to the right, and we still had to get out of the station in time. I think he was still disoriented. It was at this moment that I brought out my Moleskine to register my first veto. I rest my case.

21/02/2020
16.00h ish
tale #1

"It's a weather station isn't it? shouted a man across the pond. the guy looked like he was in his seventies. He was wearing a matching checkerboard outfit and seemed quite astute.
"you could call it that., he's mobile you know. I made him myself" I shouted back.
"I used to work at the nearby weather station in Charleroi, he said. I don't believe in any forecast that predicts past the threshold of a week. I don't believe any of it. You cannot simulate the weather;
it doesn't work that way." He yelled back before pursuing his walk.

22/02/2020
14.45h ish
tale #2

"that's the thing they make google maps with, he must be an engineer of some kind, my nephew does the same thing" I could barely understand his murmuring.
His son looked quite apathetic while they rode away on their bicycles.

25/02/2020
11.30h ish
tale #3

"I've seen you around a few times now, what you are doing here?" She was sitting alone on her front porch with, what seemed to, be an alcoholic beverage. "I'm following my robot. He's supposed to lead me back home."
"you mean like a gps?" from her response, i could sense that she understood my mediocre French Fugazzi. "yes, well, he doesn't work very well. he's trying though. he can only make sounds. I've been following him for the past 4 days"
"And are you close?" she asked "I don't think so, to be fair, I'm not even sure where I am now." When I continued, I could only yearn for a sip of that drink.

28/04/2020
21.15h ish
tale #5

I pulled him over the broken cornstalks all the way to the top of the field. I reasoned it would be the best place to set up camp. I was lost, but I had got used to the feeling; it gave me comfort, actually. I turned around and could clearly see the light pollution coming from the nearby highway, illuminating the sky and the surrounding biosphere. A radical reminder of an ecological crisis. I gently balanced him on his three wooden legs and started to prepare my tent for the night. The nights were spent in solitude. The rules were clear on this: no contact was permitted unless there was an emergency. Not that I wanted to; he would not fit into the tent anyway. The annoying sounds he made by day slowly turned into comforting beeps during the night. He was always searching; that is how I designed him.

25/02/2020
8.00h ish
tale #4

"So" "...." I was resting for a brief moment at the bus stop in Zottegem. Storm Ciara wreaked havoc during the night and tempered with his electronics. The night before, I set up my tent upwind, trying to avoid the falling branches. It worked for the most part, but I didn't incalculate the rain flowing down from higher up. I think that's what got to him.
His measurements were still accurate but his communication seemed off. He started out as a solid apparatus, but after 82 kilometres, he gradually transformed into an unstable mess of 52 kilograms. I don't even blame him. At that moment I felt exactly the same.

"So, what is it? Are you an engineer?" I didn't even notice him. I was lost, and in more ways than just my thoughts.
"No, I am an architect and this is my robot." I answered, almost instinctively. "sure..." In hindsight, I don't think he believed me.



25/02/2020
8.00h ish
tale #6

“What is that?” the man asked while slowing down next to me. He seemed to be in his early thirties and was riding one of those fixie bikes.
“It's a robot,” I answered. Wary of the implied social distancing rules, he got off his bicycle to take a closer look.
“What does it do?” he asked, intrigued.
“Well, he tries to make sense of the world around him. If you come into his view, you might influence his output,” I said, hopeful that he would. “What do you mean?”
“Well, at the end of his journey he outputs little artifacts he classified during the day. He does this as a way to communicate back to me.”
“So, there is a laser scanning device that scans the environment?” I could tell he was interested in the same technological niche as I was, so I felt encouraged to tell him how this works. “No, I trained him to do so using a machine learning framework I made. Are you interested in how it works?”
“Well, yes, I work at Robovision; we specialize in machine learning for commercial applications, actually.”
“Well, the first step he undertakes is to build an object library. He detects objects through pattern generation and classifies them as such. He analyses the live feed from the GoPro camera and searches for recurring pattern formations: a car has four distinct wheels, a bicycle has two, a human has two arms, and so on. These elements are trained in an adversarial network using an existing pretrained database. In the beginning, when the library is sparse, so is the detection. The longer you travel, the more precise the system gets. The challenge now lies in aligning these unstructured classification models. We can achieve this through 3D backpropagation: basically, he learns from himself, comparing new data against existing data to see whether he is evolving in the right direction.”
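Purely as an illustration of the object-library step he describes, here is a minimal sketch in Python. It assumes a COCO-pretrained detector from torchvision standing in for his adversarial setup; the frame, the label subset and the `observe` helper are hypothetical stand-ins rather than anything from the project itself.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# A small subset of COCO labels, enough for the "car, bicycle, human" examples.
COCO_NAMES = {1: "person", 2: "bicycle", 3: "car"}

# A pretrained detector stands in for the "existing pretrained database".
model = fasterrcnn_resnet50_fpn(pretrained=True).eval()

object_library = {}  # class name -> number of sightings so far

def observe(frame, threshold=0.7):
    """Classify one camera frame (a 3xHxW float tensor) and grow the library."""
    with torch.no_grad():
        detections = model([frame])[0]
    for label, score in zip(detections["labels"], detections["scores"]):
        name = COCO_NAMES.get(int(label))
        if name is not None and float(score) >= threshold:
            object_library[name] = object_library.get(name, 0) + 1

# The longer the journey, the denser the library and the sharper the detection.
observe(torch.rand(3, 480, 640))  # stand-in for one live GoPro frame
print(object_library)
```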
“But how do you train on 3D data?”
“As you know, at the moment it is not possible to train a generative network directly on or with 3D geometry, so I made a workaround, which works as follows. I made an encoder that can translate a 3D object into its 2D representative image, its geometry image. Let's encode a point as an example. A point, as you know, is the most primitive piece of 3D geometry in space. A point consists of five things. First, we have its position vector, e.g. (0, 0, 2). Second, we have its point index, e.g. 5. Third, we have its color information, stored as an RGB unit vector, e.g. (0.5, 0.1, 0.8). Fourth, we have its vertex information (the information which tells the point to which other points it has to connect to make polygons). And lastly, we have its primitive type (mostly a primitive of type "polygon").
Now, if we want to encode this point into a pixel, what can we do? The first thing we do is set the image resolution; then we can map the point index to a given pixel on that image. In this case, point index five corresponds to the fifth pixel, counting from left to right. To encode the position and color values, we misuse the dynamic range of a 32-bit image, expressing position and color outside of the traditional 0-255 RGB color spectrum.

Then I can port these 32-bit images into an adversarial network I made. After training, I decode the image back into position, color and all the other values, until it becomes a 3D object representative of the training data.”
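As a minimal sketch of the point-to-pixel encoding just described: the code below assumes a float32 array as the "32-bit image" and a toy channel layout of my own (xyz in the first three channels, RGB in the next three); his actual format may differ.

```python
import numpy as np

# Toy point cloud: each point has a position (x, y, z) and an RGB color.
positions = np.array([[0.0, 0.0, 2.0], [1.5, -3.2, 0.7]], dtype=np.float32)
colors = np.array([[0.5, 0.1, 0.8], [0.9, 0.9, 0.2]], dtype=np.float32)

def encode(positions, colors, width=256):
    """Pack each point into one pixel of a 32-bit float image.

    The point index selects the pixel (index five -> fifth pixel from the
    left); channels 0-2 hold the raw xyz position and channels 3-5 the
    color, both free to leave the traditional 0-255 range: the "misused"
    dynamic range of the 32-bit image.
    """
    image = np.zeros((1, width, 6), dtype=np.float32)
    count = len(positions)
    image[0, :count, 0:3] = positions
    image[0, :count, 3:6] = colors
    return image

def decode(image, count):
    """Read position and color back out of the geometry image."""
    pixels = image[0, :count]
    return pixels[:, 0:3], pixels[:, 3:6]

geometry_image = encode(positions, colors)
out_positions, out_colors = decode(geometry_image, len(positions))
assert np.allclose(out_positions, positions)  # lossless round trip
```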
“Why would you bother making it mobile? Wouldn't it be way more effective if it were a static machine?”
That is correct. However, I am not that interested in an accurate reading. My interest lies in the process that follows, full of distractions and obstacles. He makes his own journey, and I accompany him to his destination, which is, in this case, my home in Ghent.
Valuable and worthless data alike are measured and integrated into his decision-making all the time, owing to both random and non-random environmental and physical distractions. Faulty measurements might get through, which will influence his decisions.
Also, he makes for great conversations.
So how do you get to your destination?
Let's, as an example, predict his next move. Every thirty minutes, he checks his measurements, compares them to those of the previous location, and outputs a direction in the form of one beep (right), two beeps (left) or silence (straight ahead).
The first thing he does is filter the useful information from the useless. If the current information differs by more than a given threshold, let's say 50%, from the previous measurement, that information is discarded and not included in the final calculation.
An example: let's assume that the anemometer (the wind instrument) measures a unit vector with an x-component of 0.2 and a y-component of 0.9, which basically means the wind is going north. At the previous location, the anemometer gave a unit vector of x = 0.5 and y = -0.8, which means the wind was going south. He therefore discards the current data, as it does not lie within the 50% threshold of the previous unit vector. This process happens for each measurement and each value type: integer, float and vector, each in more or less similar ways.
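A small sketch of that filtering step: here the 50% threshold is applied to the Euclidean distance between successive unit vectors, scaled by its maximum of 2.0; the exact metric is my assumption, not his.

```python
import math

def differs_too_much(current, previous, threshold=0.5):
    """Flag a reading that strays more than `threshold` (as a fraction of
    the maximum possible distance between unit vectors) from the last one."""
    return math.dist(current, previous) / 2.0 > threshold

# The anemometer example from the text: wind flipping from south to north.
previous = (0.5, -0.8)  # roughly south
current = (0.2, 0.9)    # roughly north
print(differs_too_much(current, previous))  # True -> reading is discarded
```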
When he has eliminated the useless information, his next step is to fit the remaining float and vector values between 0 and 1. By doing this, he eliminates the hierarchy that would occur if higher values had a bigger impact on his output. Each stop has a value associated with it, and by comparing it to A, his previous stop, and B, his home location, he can draw a reference vector from which he can decide whether to go left, right or keep going forward.
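And a hedged sketch of this final step: the min-max normalization and the turn rule (the sign of the 2D cross product between the last leg of travel and the bearing home) are my guesses at one workable implementation, not his published logic.

```python
def normalize(values):
    """Fit raw sensor floats between 0 and 1 so no single value dominates."""
    low, high = min(values), max(values)
    if high == low:
        return [0.0 for _ in values]
    return [(v - low) / (high - low) for v in values]

def decide(current, previous, home):
    """Compare the current stop to A (previous) and B (home); pick a turn.

    The heading vector is the last leg of travel and the reference vector
    points home; the sign of their cross product tells on which side of
    the heading home lies.
    """
    heading = (current[0] - previous[0], current[1] - previous[1])
    to_home = (home[0] - current[0], home[1] - current[1])
    cross = heading[0] * to_home[1] - heading[1] * to_home[0]
    if abs(cross) < 1e-6:
        return "silence (straight ahead)"
    return "two beeps (left)" if cross > 0 else "one beep (right)"

print(normalize([3.0, 12.0, 7.5]))                      # [0.0, 1.0, 0.5]
print(decide((2.0, 1.0), (1.0, 1.0), home=(5.0, 4.0)))  # two beeps (left)
```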
“After your thesis is finished, you should come by,” he said while giving me his card. I kindly accepted.



A “car” model is generated through the previously mentioned techniques. Every topological decision is backed by a value and a concept.

This is a “geometry image” of the “car” geometry, generated by a custom-trained neural network during my one-day journey in Ghent. The implementation is based on existing research by Hugues Hoppe. Instead of using the uv-transformation as a segmentation value, I used the point segmentation value, which allows me to fully automate the process without making any uv-seams.

Each pixel represents its own position, color information, vertex information and polygon information. By storing the vertex information, we are able to train on a “polygonal” representation instead of a “point” representation, which means that the output can also be decoded into polygons and not just points.
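To make that last point concrete, a tiny sketch of how stored vertex indices can come back out as polygons rather than loose points; the three-indices-per-pixel triangle layout is an assumption for illustration only.

```python
import numpy as np

# Extra channels of a geometry-image row: each pixel also stores the
# indices of the three points that form one triangle (assumed layout).
triangle_channels = np.array([
    [0, 1, 2],  # pixel 0: triangle connecting points 0, 1 and 2
    [1, 2, 3],  # pixel 1: triangle connecting points 1, 2 and 3
], dtype=np.int32)

def decode_polygons(channels):
    """Turn stored vertex indices back into polygon connectivity."""
    return [tuple(int(i) for i in pixel) for pixel in channels]

print(decode_polygons(triangle_channels))  # [(0, 1, 2), (1, 2, 3)]
```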


© Joris Putteneers 2021
/
current project: Ugly, Stupid, Honest
project collaborator(s): Louise Bivort, Jef Boes
project year: 2020